    Characterization of the co-seismic slip field for large earthquakes

    Focused studies of large earthquakes have highlighted that ruptures on the generating faults are strongly heterogeneous. The main aim of this thesis is to explore characteristic patterns in the slip distribution of large earthquakes, using the finite-fault models (FFMs) obtained over the last 25 years. One result of this thesis is the computation of regression laws linking focal parameters with magnitude. Particular attention was devoted to the aspect ratio (A.R.), defined as the ratio between fault length and width. The FFMs were partitioned into three A.R. classes and, for each class, the positions of the hypocentre and of the maximum slip, as well as their mutual relation, were investigated. To favour inter-comparison, normalised images of on-fault seismic slip were produced on geometries typical of each A.R. class, with the goal of finding possible regularities in the shapes of the slip distributions. In this thesis, the shape of the single-asperity FFMs has been fitted by means of 2D Gaussian distributions. To the knowledge of the author, this thesis is the first systematic study of finite-fault solutions aimed at identifying the main pattern of the on-fault co-seismic slip of large earthquakes. One of the foreseen applications is related to tsunamigenesis. It is known that tsunamis are mainly determined by the vertical displacement of the seafloor induced by large submarine earthquakes. In this thesis, the near-source vertical-displacement fields produced by the FFM (taken as the reference), by a homogeneous fault model, by a depth-heterogeneous fault model and by two distinct 2D Gaussian fault models were computed and compared for all single-asperity earthquakes. The main finding is that the 2D Gaussian distributions yield the smallest misfit and are therefore the most adequate for tsunami generation modelling.
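
    As an illustration of the kind of fit described above, the short sketch below (not the thesis code) fits an elliptical 2D Gaussian to a gridded on-fault slip distribution with scipy.optimize.curve_fit; the array name `slip`, the grid spacings and the initial guesses are assumptions made for the example.

```python
# Illustrative sketch: fit a 2D Gaussian surface to a gridded on-fault slip
# distribution, as a proxy for a single-asperity model.  Assumes `slip` is a
# 2D array of slip values (m) on a regular along-strike / down-dip grid;
# the grid spacings dx, dy (km) are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_2d(coords, amp, x0, y0, sx, sy, theta):
    """Elliptical 2D Gaussian evaluated at flattened (x, y) coordinates."""
    x, y = coords
    a = np.cos(theta)**2 / (2 * sx**2) + np.sin(theta)**2 / (2 * sy**2)
    b = -np.sin(2 * theta) / (4 * sx**2) + np.sin(2 * theta) / (4 * sy**2)
    c = np.sin(theta)**2 / (2 * sx**2) + np.cos(theta)**2 / (2 * sy**2)
    return amp * np.exp(-(a * (x - x0)**2 + 2 * b * (x - x0) * (y - y0) + c * (y - y0)**2))

def fit_slip_gaussian(slip, dx=5.0, dy=5.0):
    """Least-squares fit of a 2D Gaussian to a slip grid; returns the parameters."""
    ny, nx = slip.shape
    x, y = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dy)
    coords = np.vstack([x.ravel(), y.ravel()])
    # Initial guess: peak amplitude at the location of maximum slip,
    # widths of a quarter of the fault length/width, no rotation.
    iy, ix = np.unravel_index(np.argmax(slip), slip.shape)
    p0 = [slip.max(), ix * dx, iy * dy, nx * dx / 4, ny * dy / 4, 0.0]
    popt, _ = curve_fit(gaussian_2d, coords, slip.ravel(), p0=p0, maxfev=10000)
    return dict(zip(["amp", "x0", "y0", "sigma_x", "sigma_y", "theta"], popt))
```

    The misfit of the fitted surface to the original FFM can then be compared with that of simpler parameterisations, such as a homogeneous-slip model.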

    Towards the new Thematic Core Service Tsunami within the EPOS Research Infrastructure

    Tsunamis constitute a significant hazard for European coastal populations, and the impact of tsunami events worldwide can extend well beyond the coastal regions directly affected. Understanding the complex mechanisms of tsunami generation, propagation, and inundation, as well as managing the tsunami risk, requires multidisciplinary research and infrastructures that cross national boundaries. Recent decades have seen both great advances in tsunami science and consolidation of the European tsunami research community. A recurring theme has been the need for a sustainable platform for coordinated tsunami community activities and a hub for tsunami services. Following about three years of preparation, in July 2021, the European tsunami community attained the status of Candidate Thematic Core Service (cTCS) within the European Plate Observing System (EPOS) Research Infrastructure. Within a transition period of three years, the Tsunami candidate TCS is anticipated to develop into a fully operational EPOS TCS. We here outline the path taken to reach this point, and the envisaged form of the future EPOS TCS Tsunami. Our cTCS is planned to be organised within four thematic pillars: (1) Support to Tsunami Service Providers, (2) Tsunami Data, (3) Numerical Models, and (4) Hazard and Risk Products. We outline how identified needs in tsunami science and tsunami risk mitigation will be addressed within this structure and how participation within EPOS will become an integration point for community development.

    Enabling dynamic and intelligent workflows for HPC, data analytics, and AI convergence

    The evolution of High-Performance Computing (HPC) platforms enables the design and execution of progressively larger and more complex workflow applications in these systems. The complexity comes not only from the number of elements that compose the workflows but also from the type of computations they perform. While traditional HPC workflows target simulations and modelling of physical phenomena, current needs require in addition data analytics (DA) and artificial intelligence (AI) tasks. However, the development of these workflows is hampered by the lack of proper programming models and environments that support the integration of HPC, DA, and AI, as well as the lack of tools to easily deploy and execute the workflows in HPC systems. To progress in this direction, this paper presents use cases where complex workflows are required and investigates the main issues to be addressed for the HPC/DA/AI convergence. Based on this study, the paper identifies the challenges of a new workflow platform to manage complex workflows. Finally, it proposes a development approach for such a workflow platform addressing these challenges in two directions: first, by defining a software stack that provides the functionalities to manage these complex workflows; and second, by proposing the HPC Workflow as a Service (HPCWaaS) paradigm, which leverages the software stack to facilitate the reusability of complex workflows in federated HPC infrastructures. Proposals presented in this work are subject to study and development as part of the EuroHPC eFlows4HPC project.

    This work has received funding from the European High-Performance Computing Joint Undertaking (JU) under grant agreement No 955558. The JU receives support from the European Union's Horizon 2020 research and innovation programme and Spain, Germany, France, Italy, Poland, Switzerland and Norway. In Spain, it has received complementary funding from MCIN/AEI/10.13039/501100011033 and the European Union NextGenerationEU/PRTR (contracts PCI2021-121957, PCI2021-121931, PCI2021-121944, and PCI2021-121927). In Germany, it has received complementary funding from the German Federal Ministry of Education and Research (contracts 16HPC016K, 6GPC016K, 16HPC017 and 16HPC018). In France, it has received financial support from Caisse des dépôts et consignations (CDC) under the action PIA ADEIP (project Calculateurs). In Italy, it has been preliminarily approved for complementary funding by Ministero dello Sviluppo Economico (MiSE) (ref. project prop. 2659). In Norway, it has received complementary funding from the Research Council of Norway under project number 323825. In Switzerland, it has been preliminarily approved for complementary funding by the State Secretariat for Education, Research, and Innovation (SERI). In Poland, it is partially supported by the National Centre for Research and Development under decision DWM/EuroHPCJU/4/2021. The authors also acknowledge financial support by MCIN/AEI/10.13039/501100011033 through the "Severo Ochoa Programme for Centres of Excellence in R&D" under Grant CEX2018-000797-S, by the Spanish Government (contract PID2019-107255GB) and by Generalitat de Catalunya (contract 2017-SGR-01414). Anna Queralt is a Serra Húnter Fellow.
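
    As a purely hypothetical illustration of the HPCWaaS paradigm described above, the sketch below shows what submitting a pre-registered workflow through a REST-style service could look like; the endpoint URL, workflow name, request fields and response keys are invented for the example and do not reproduce the actual eFlows4HPC interface.

```python
# Hypothetical sketch of the HPC Workflow as a Service (HPCWaaS) idea:
# a registered workflow is deployed and executed through a REST interface
# instead of the user scripting job submission on each HPC system.
# The endpoint, workflow name and fields below are illustrative only.
import time
import requests

WAAS_URL = "https://waas.example.org/api/v1"   # placeholder service endpoint

def run_workflow(name: str, machine: str, inputs: dict) -> dict:
    """Submit a pre-registered workflow and poll until it finishes."""
    resp = requests.post(f"{WAAS_URL}/workflows/{name}/executions",
                         json={"target_machine": machine, "inputs": inputs},
                         timeout=30)
    resp.raise_for_status()
    exec_id = resp.json()["execution_id"]      # assumed response field

    while True:
        status = requests.get(f"{WAAS_URL}/executions/{exec_id}", timeout=30).json()
        if status["state"] in ("FINISHED", "FAILED"):
            return status
        time.sleep(10)   # simple polling; a real client might use callbacks

if __name__ == "__main__":
    result = run_workflow("tsunami-simulation",
                          machine="marenostrum",
                          inputs={"event_id": "demo", "resolution": "coarse"})
    print(result["state"])
```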

    Dislocations in an elastic half-space with a free surface

    This thesis deals with dislocation problems in elastic media, which are useful for representing the displacement, strain and stress fields generated by internal sources. In particular, treating the Earth as an elastic body, the problems addressed concern the description of stress fields near the Earth's surface, whose deformation is one of the most useful means of investigating seismic and volcanic sources. To a very good approximation, the Earth's surface can be described as a free surface, i.e. a surface on which the externally applied tractions vanish. For the kind of study pursued here, this approximation is justified by the fact that seismic sources are usually located several kilometres deep, where it is legitimate to neglect external forces due to atmospheric pressure, the action of air currents, the gravitational pull of the Sun and other celestial bodies, etc. When studying regions whose linear dimensions are much smaller than the Earth's radius, the Earth can be approximated as a homogeneous elastic half-space bounded by a horizontal free surface. In the following, reference is made to an orthogonal Cartesian system with the y and z axes directed horizontally and the x axis directed vertically into the half-space. The Earth's surface is therefore described by the plane x = 0 and, denoting the stress tensor by T_ij, the free-surface condition reads T_xx = T_xy = T_xz = 0 at x = 0. The thesis presents some methods that allow solutions already known for unbounded elastic spaces to be extended to half-spaces bounded by a free surface.
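
    As a minimal illustration of such an extension method, the sketch below (not part of the thesis) numerically checks the image construction for the simplest case, an antiplane (screw) dislocation line parallel to the free surface: adding an opposite-sign image source at the mirror point cancels the traction component T_xz on the plane x = 0. The material constant and source depth are placeholder values.

```python
# Minimal numerical check of the image method for the simplest half-space
# problem: a screw (antiplane) dislocation line parallel to the free surface
# x = 0, with the half-space occupying x > 0.  An image dislocation of
# opposite sign at the mirror point x = -d cancels the traction T_xz on x = 0.
import numpy as np

MU = 30e9      # shear modulus (Pa), illustrative value
B = 1.0        # Burgers vector (m)
D = 5e3        # source depth below the free surface (m)

def sigma_xz(x, y, xs, b):
    """Antiplane stress T_xz of a screw dislocation line located at (xs, 0)."""
    r2 = (x - xs)**2 + y**2
    return -MU * b / (2 * np.pi) * y / r2

# Evaluate on the free surface x = 0 over a range of horizontal distances y.
y = np.linspace(-50e3, 50e3, 1001)
y = y[y != 0.0]                      # avoid the point directly above the line
real  = sigma_xz(0.0, y, D, B)       # source inside the half-space
image = sigma_xz(0.0, y, -D, -B)     # opposite-sign image at the mirror point

print("max |T_xz| without image :", np.max(np.abs(real)))
print("max |T_xz| with image    :", np.max(np.abs(real + image)))  # ~ 0
```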

    Analysis of slip distribution of large earthquakes oriented to tsunamigenesis characterisation

    The present thesis focuses on the on-fault slip distribution of large earthquakes in the framework of tsunami hazard assessment and tsunami warning improvement. It is widely known that ruptures on seismic faults are strongly heterogeneous. In the case of tsunamigenic earthquakes, the slip heterogeneity strongly influences the spatial distribution of the largest tsunami effects along the nearest coastlines. Unfortunately, after an earthquake occurs, the so-called finite-fault models (FFMs) describing the coseismic on-fault slip pattern become available over time scales that are incompatible with early tsunami warning purposes, especially in the near field. Our work aims to characterise the slip heterogeneity rapidly, yet accurately enough for warning purposes. Finite-fault models are used to build a starting dataset of seismic events, and the characteristics of the fault planes are studied as a function of magnitude. The patterns of the slip distribution on the rupture plane, analysed with a cluster identification algorithm, reveal a preferential single-asperity representation that can be approximated by a two-dimensional Gaussian slip distribution (2D GD). The goodness of fit of the 2D GD model is compared with that of other distributions used in the literature, and its ability to represent the slip heterogeneity in the form of the main asperity is demonstrated. The magnitude dependence of the 2D GD parameters is investigated and turns out to be of primary importance from an early warning perspective. The Gaussian model is applied to the 16 September 2015 Illapel, Chile, earthquake and used to compute early tsunami predictions that compare satisfactorily with the available observations. The fast computation of the 2D GD and its suitability in representing the slip complexity of the seismic source make it a useful tool for tsunami early warning assessments, especially in the near field.
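
    As an illustration of how such a single-asperity model can be built once its parameters are prescribed, the sketch below constructs a magnitude-scaled 2D Gaussian slip model; the moment-area scaling, the rigidity and the Gaussian widths are generic placeholders, not the regressions derived in the thesis.

```python
# Illustrative sketch: build a single-asperity 2D Gaussian slip model for a
# given magnitude.  The fault dimensions come from a generic scaling relation
# and the slip is rescaled so that the seismic moment
# M0 = mu * sum(slip * cell_area) matches the target magnitude.
# All numerical coefficients below are placeholders for illustration only.
import numpy as np

MU = 30e9  # rigidity (Pa), illustrative value

def gaussian_slip_model(magnitude, nx=50, ny=25, aspect_ratio=2.0):
    """Return (slip grid in m, length in km, width in km) for a target Mw."""
    m0 = 10 ** (1.5 * magnitude + 9.1)              # scalar moment (N m)
    area_km2 = 10 ** (magnitude - 4.0)              # placeholder area scaling
    width_km = np.sqrt(area_km2 / aspect_ratio)
    length_km = aspect_ratio * width_km

    # Unit-amplitude 2D Gaussian centred on the fault, widths tied to L and W.
    x = np.linspace(0, length_km, nx)
    y = np.linspace(0, width_km, ny)
    xx, yy = np.meshgrid(x, y)
    shape = np.exp(-((xx - length_km / 2) ** 2 / (2 * (length_km / 4) ** 2)
                     + (yy - width_km / 2) ** 2 / (2 * (width_km / 4) ** 2)))

    # Rescale so the discrete moment matches m0.
    cell_area_m2 = (length_km / nx) * (width_km / ny) * 1e6
    slip = shape * m0 / (MU * np.sum(shape) * cell_area_m2)
    return slip, length_km, width_km

slip, L, W = gaussian_slip_model(8.3)   # e.g. an Illapel-size event
print(f"fault {L:.0f} x {W:.0f} km, peak slip {slip.max():.1f} m")
```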

    A heuristic features selection approach for scenario analysis in a regional seismic probabilistic tsunami hazard assessment

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) aims at estimating the annual rate at which an earthquake-induced tsunami wave exceeds a predefined height threshold at a given location. The analysis relies on computationally demanding numerical simulations of seismically induced tsunami wave generation and propagation. A large number of scenarios need to be simulated to account for uncertainties. However, the exceedance of the tsunami wave height threshold is a rare event, so most of the simulated scenarios contribute little statistically to the estimate of the annual rate while increasing the computational burden. To address this issue efficiently, we propose a wrapper-based heuristic approach to select the most relevant features of the seismic model and decide a priori which seismic scenarios to simulate. The proposed approach is based on a Multi-Objective Differential Evolution Algorithm (MODEA) and is developed with reference to a case study whose objective is calculating the annual rate of threshold exceedance of the height of tsunami waves caused by subduction earthquakes that might be generated on a section of the Hellenic Arc, and propagated to a set of target sites: Siracusa, on the eastern coast of Sicily, Crotone, on the southern coast of Calabria, and Santa Maria di Leuca, on the southern coast of Puglia. The results show that, in all cases, the proposed approach reduces the number of scenarios by 95% while retaining only half of the features, with no appreciable loss of accuracy.
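
    For context, the sketch below (not the paper's workflow) shows the basic quantity SPTHA targets: the annual exceedance rate at a site obtained by summing the mean annual rates of all simulated scenarios whose wave height exceeds the threshold; the variable names and the toy numbers are assumptions made for the example.

```python
# Minimal sketch of the SPTHA target quantity: the annual rate at which the
# tsunami height at a site exceeds a threshold, accumulated over a discrete
# set of earthquake scenarios.  `annual_rates` and `max_heights` are assumed
# inputs: the mean annual rate of each scenario and the simulated maximum
# wave height it produces at the target site.
import numpy as np

def exceedance_rate(annual_rates, max_heights, threshold):
    """Annual rate of exceeding `threshold` at one site: sum of the rates of
    all scenarios whose simulated wave height exceeds the threshold."""
    annual_rates = np.asarray(annual_rates)
    max_heights = np.asarray(max_heights)
    return annual_rates[max_heights > threshold].sum()

# Toy example: three scenarios, only the rarest one exceeds a 2 m threshold,
# illustrating why most simulated scenarios add cost but little statistics.
rates   = [1e-3, 5e-4, 2e-5]   # mean annual rates (1/yr), illustrative
heights = [0.3,  1.1,  3.4]    # simulated max heights at the site (m)
print(exceedance_rate(rates, heights, threshold=2.0))   # -> 2e-05
```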
